Machine learning sparse tight-binding parameters for defects

Authors

Abstract

We employ machine learning to derive tight-binding parametrizations for the electronic structure of defects. We test several methods that map the atomic and electronic structure of a defect onto a sparse tight-binding parametrization. Since multi-layer perceptrons (i.e., feed-forward neural networks) perform best, we adopt them for our further investigations. We demonstrate the accuracy of the parametrizations for a range of important properties, such as band structure, local density of states, transport, and level-spacing simulations, for two common defects in single-layer graphene. Our approach achieves results comparable to maximally localized Wannier functions (i.e., DFT accuracy) without prior knowledge about the defect, while also allowing for a reduced interaction range, which substantially reduces calculation time. The approach is general and can be applied to a wide range of other materials, enabling accurate large-scale material simulations in the presence of different defects.
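As a rough illustration of the workflow described in the abstract, the sketch below maps a hypothetical descriptor of a defect's local environment onto a sparse set of tight-binding parameters with a multi-layer perceptron and inserts the prediction into a small Hamiltonian block. It is not the authors' code: the descriptor, array shapes, network size, and connectivity pattern are placeholder assumptions, and scikit-learn's MLPRegressor merely stands in for a feed-forward network.

```python
# Minimal sketch (not the paper's implementation): learn a map from local defect
# descriptors to sparse tight-binding parameters with a multi-layer perceptron.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical training data:
#   X[i] -- descriptor of the atomic environment around defect configuration i
#   Y[i] -- target parameters (e.g., from Wannierized DFT), flattened as
#           [onsite_1, ..., onsite_10, t_1, ..., t_10]
n_samples, n_features, n_sites, n_hoppings = 500, 32, 10, 10
X = rng.normal(size=(n_samples, n_features))
Y = rng.normal(size=(n_samples, n_sites + n_hoppings))

mlp = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu",
                   max_iter=5000, random_state=0)
mlp.fit(X, Y)                      # one multi-output regressor for all parameters

# Predict parameters for a new defect environment and place them in a sparse
# Hamiltonian block (the connectivity pattern below is purely illustrative).
params = mlp.predict(X[:1])[0]
onsite, hoppings = params[:n_sites], params[n_sites:]
H = np.diag(onsite)
for k, (i, j) in enumerate([(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]):
    H[i, j] = H[j, i] = hoppings[k]
```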


Similar articles

Transferable Orthogonal Tight Binding Parameters for CdS, CdSe and CdTe

The Tight Binding (TB) method was first developed by Slater and Koster [1] for band-structure calculations. With the advent of nanostructures, the method is experiencing a revival due to its computational speed and its capability of retaining the quantum mechanical description essential for electronic-structure calculations. Despite being a parametric technique, the TB method has the added advantage of transferab...
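As a generic illustration of the Slater-Koster construction, and not the orthogonal CdS/CdSe/CdTe parameter sets this abstract refers to, the sketch below builds the Bloch Hamiltonian of a one-dimensional chain with one s and one p_z orbital per atom from two-centre integrals and diagonalizes it across the Brillouin zone. The onsite energies and integrals are made-up numbers.

```python
# Hedged Slater-Koster example: s/p_z chain with nearest-neighbour two-centre
# integrals V_ss_sigma, V_sp_sigma, V_pp_sigma (all values are placeholders).
import numpy as np

a = 1.0                                  # lattice constant (arbitrary units)
eps_s, eps_p = -3.0, 1.0                 # onsite energies
V_sss, V_sps, V_pps = -1.0, 1.2, 2.0     # two-centre integrals along the chain

def hk(k):
    """2x2 Bloch Hamiltonian at wavevector k."""
    h = np.zeros((2, 2), dtype=complex)
    h[0, 0] = eps_s + 2.0 * V_sss * np.cos(k * a)
    h[1, 1] = eps_p + 2.0 * V_pps * np.cos(k * a)
    h[0, 1] = 2.0j * V_sps * np.sin(k * a)   # s-p coupling is odd in k
    h[1, 0] = np.conj(h[0, 1])
    return h

kpts = np.linspace(-np.pi / a, np.pi / a, 201)
bands = np.array([np.linalg.eigvalsh(hk(k)) for k in kpts])  # shape (201, 2)
```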


Calculation for Energy of (111) Surfaces of Palladium in Tight Binding Model

In this work, a calculation of the energetics of transition-metal surfaces is presented. The tight-binding model is employed to calculate the energetics. The tight-binding basis set is limited to d orbitals, which is valid for elements at the end of the transition-metal series. In our analysis we concentrate on electronic effects at temperature T = 0 K, which means that no entropic term will be pr...
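In the same spirit, a minimal T = 0 K band-energy sketch is given below, with a single effective orbital per layer standing in for the d-orbital basis; the hoppings, filling, and layer counts are hypothetical values, not a Pd(111) parametrization.

```python
# Hedged sketch: T = 0 K tight-binding band energy of a layered stack, with a
# single effective orbital per layer (placeholder for a full d-orbital basis).
import numpy as np

def band_energy(n_layers, eps=0.0, t=-1.0, filling=0.5):
    """Sum of occupied eigenvalues at T = 0 (no entropic term)."""
    H = np.diag(np.full(n_layers, eps))
    idx = np.arange(n_layers - 1)
    H[idx, idx + 1] = H[idx + 1, idx] = t    # nearest-neighbour inter-layer hopping
    e = np.linalg.eigvalsh(H)
    n_occ = int(round(filling * n_layers))   # fixed band filling at T = 0
    return e[:n_occ].sum()

# Crude surface-energy estimate: finite slab vs. a per-layer reference taken from
# a much longer chain (which approximates the bulk band energy per layer).
n, n_ref = 40, 400
excess = band_energy(n) - band_energy(n_ref) * n / n_ref
print("energy per surface:", excess / 2)
```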


SparCML: High-Performance Sparse Communication for Machine Learning

One of the main drivers behind the rapid recent advances in machine learning has been the availability of efficient system support. This comes both from faster hardware and from efficient software frameworks and programming models. Despite this progress, scaling compute-intensive machine learning workloads to a large number of compute nodes is still a challenging task. In...


A dimension adaptive sparse grid combination technique for machine learning

We introduce a dimension adaptive sparse grid combination technique for the machine learning problems of classification and regression. A function over a d-dimensional space, which is assumed to describe the relationship between the features and the response variable, is reconstructed using a linear combination of partial functions that possibly depend only on a subset of all features. The partial ...
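To make the decomposition concrete, the sketch below, which is not the paper's dimension-adaptive combination technique, fits a target as a linear combination of partial functions that each depend on a feature subset of size one or two, represents each partial function on a coarse grid, and fits all of them jointly by ridge regression. The subset list, grid resolution, and regularization strength are arbitrary assumptions; a dimension-adaptive method would instead grow the subset list based on estimated error contributions.

```python
# Hedged sketch of an additive decomposition f(x) ~ sum_S f_S(x_S) over small
# feature subsets S, each partial function represented on a coarse grid.
import itertools
import numpy as np

rng = np.random.default_rng(1)
d, n, n_bins = 5, 2000, 8
X = rng.uniform(size=(n, d))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=n)

def subset_features(X, subset, n_bins):
    """One-hot grid features for the partial function depending on `subset`."""
    bins = np.clip((X[:, subset] * n_bins).astype(int), 0, n_bins - 1)
    flat = np.ravel_multi_index(bins.T, (n_bins,) * len(subset))
    out = np.zeros((X.shape[0], n_bins ** len(subset)))
    out[np.arange(X.shape[0]), flat] = 1.0
    return out

# Enumerate subsets of size <= 2 (a dimension-adaptive scheme would select these).
subsets = [(i,) for i in range(d)] + list(itertools.combinations(range(d), 2))
Phi = np.hstack([subset_features(X, list(s), n_bins) for s in subsets])

lam = 1e-2                                   # ridge regularization
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
print("training RMSE:", np.sqrt(np.mean((Phi @ w - y) ** 2)))
```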



Journal

Journal title: npj Computational Materials

Year: 2022

ISSN: 2057-3960

DOI: https://doi.org/10.1038/s41524-022-00791-x